Data Fusion for Real-time Multimodal Emotion Recognition through Webcams and Microphones in E-Learning

Authors

  • Kiavash Bahreini
  • Rob Nadolski
  • Wim Westera
Abstract

This paper describes a validation study of our software, part of the FILTWAM framework, which uses combined webcam and microphone data for real-time, continuous, and unobtrusive emotion recognition. FILTWAM aims to deploy a real-time multimodal emotion recognition method that provides more adequate feedback to learners during online communication skills training. Such training requires timely feedback that reflects the intended emotions learners display and increases their awareness of their own behaviour. At the very least, a reliable and valid software interpretation of performed facial and vocal emotions is needed to warrant such feedback; this validation study therefore calibrates our software. The study uses a multimodal fusion method. Twelve test persons performed computer-based tasks in which they were asked to mimic specific facial and vocal emotions. All test persons' behaviour was recorded on video, and two raters independently scored the shown emotions, which were then contrasted with the software's recognition outcomes. A hybrid multimodal fusion method shows accuracies between 96.1% and 98.6% for the best-performing WEKA classifiers over the predicted emotions. The software fulfils its requirements of real-time data interpretation and reliable results.
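The abstract names a hybrid multimodal fusion over webcam- and microphone-based classifier outputs but does not detail the fusion scheme. As a purely illustrative sketch, and not the authors' actual method, the Java snippet below shows one common decision-level fusion step: combining face and voice class-probability vectors with a weighted average and selecting the most likely emotion. The emotion labels, weights, and probability values are hypothetical.

    import java.util.Arrays;

    /** Illustrative decision-level fusion of face and voice emotion classifiers. */
    public class LateFusionSketch {

        // Hypothetical emotion labels; the paper's actual label set may differ.
        static final String[] EMOTIONS = {"happy", "sad", "angry", "surprised", "neutral"};

        /**
         * Fuses two per-modality probability distributions with a weighted average.
         * faceWeight and voiceWeight are assumed to sum to 1.0.
         */
        static double[] fuse(double[] faceProbs, double[] voiceProbs,
                             double faceWeight, double voiceWeight) {
            double[] fused = new double[faceProbs.length];
            for (int i = 0; i < fused.length; i++) {
                fused[i] = faceWeight * faceProbs[i] + voiceWeight * voiceProbs[i];
            }
            return fused;
        }

        /** Returns the emotion label with the highest fused probability. */
        static String predict(double[] fusedProbs) {
            int best = 0;
            for (int i = 1; i < fusedProbs.length; i++) {
                if (fusedProbs[i] > fusedProbs[best]) best = i;
            }
            return EMOTIONS[best];
        }

        public static void main(String[] args) {
            // Hypothetical outputs of the webcam (face) and microphone (voice) classifiers.
            double[] faceProbs  = {0.60, 0.05, 0.10, 0.15, 0.10};
            double[] voiceProbs = {0.40, 0.10, 0.20, 0.10, 0.20};

            double[] fused = fuse(faceProbs, voiceProbs, 0.5, 0.5);
            System.out.println("Fused distribution: " + Arrays.toString(fused));
            System.out.println("Predicted emotion:  " + predict(fused));
        }
    }

In practice the per-modality probability vectors would come from trained classifiers (the paper reports using WEKA classifiers), and a hybrid scheme may also combine feature-level information before this decision-level step; those details are in the full text.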

Related articles

FILTWAM and Voice Emotion Recognition

This paper introduces the voice emotion recognition part of our framework for improving learning through webcams and microphones (FILTWAM). This framework enables multimodal emotion recognition of learners during game-based learning. The main goal of this study is to validate the use of microphone data for a real-time and adequate interpretation of vocal expressions into emotional states were t...


Prediction of asynchronous dimensional emotion ratings from audiovisual and physiological data

Automatic emotion recognition systems based on supervised machine learning require reliable annotation of affective behaviours to build useful models. Whereas the dimensional approach is getting more and more popular for rating affective behaviours in continuous time domains, e.g., arousal and valence, methodologies to take into account reaction lags of the human raters are still rare. We theref...


Evidence Theory-Based Multimodal Emotion Recognition

Automatic recognition of human affective states is still a largely unexplored and challenging topic. Even more issues arise when dealing with variable quality of the inputs or aiming for real-time, unconstrained, and person independent scenarios. In this paper, we explore audio-visual multimodal emotion recognition. We present SAMMI, a framework designed to extract real-time emotion appraisals ...


FILTWAM - A Framework for Online Affective Computing in Serious Games

This paper introduces a Framework for Improving Learning Through Webcams And Microphones (FILTWAM). It proposes an overarching framework comprising conceptual and technical frameworks for enhancing the online communication skills of lifelong learners. Our approach interprets the emotional state of people using webcams and microphones and combines relevant and timely feedback based upon learner'...


Using Emotions to Tag Media

Multimedia information indexing and retrieval is about developing techniques which allow people to effectively find the media they are looking for. Content-based methods become necessary when dealing with big databases due to the limitations inherent in metadata-based systems. Current technology allows researchers to explore the emotional space which is known to carry very interesting semantic ...



Journal:
  • Int. J. Hum. Comput. Interaction

Volume 32, Issue —

Pages —

Publication date: 2016